METHOD OF RECOGNIZING FORMS, COMPUTER PROGRAM PRODUCT, AND MOBILE TERMINAL
Patent abstract:
Publication number: BE1020588A5 Application number: E201200548 Filing date: 2012-08-13 Publication date: 2014-01-07 Inventors: Muelenaere Pierre De; Michel Dauw; Olivier Dupont; Patrick Verleysen Applicant: Iris Sa; Main IPC class:
Patent description:
Shape recognition method, computer program product and mobile terminal

Technical field

The present invention relates to a pattern recognition method. The present invention also relates to a computer program product for implementing said pattern recognition method and to a mobile terminal having said character recognition method in executable format on the mobile terminal.

Prior art

Optical Character Recognition (OCR) systems are known in the art. They convert an image of printed text into machine-readable code using a character recognition method. In an OCR system, images of what could be characters are isolated and a character recognition method is used to identify each character. A character recognition method, such as that shown in FIG. 1, generally comprises: (a) a feature extraction process 102 that extracts a feature vector from the character input image 101; (b) a classification process 103 that compares the feature vector with models 104 and assigns the feature vector to a class of a given set of classes, which is the output 105. In prior art OCR systems, the classification process must produce not only a class but also alternative classes and confidence levels, and the OCR system includes a contextual decision system that uses this information, together with linguistic or typographical contextual information, to produce the best-recognized text. The set of features that are calculated describes the shapes of the characters to be recognized. They must be discriminating, insensitive to character distortion and added noise, and must offer reliable confidence levels. On the other hand, some character recognition processes are based on pattern matching, but these character recognition processes can only recognize text written in a limited number of fonts. However, the confidence levels offered by these pattern-matching processes are normally more reliable than those of feature-based character recognition systems.
Disclosure of the invention

An object of this invention is to provide a pattern recognition method that provides reliable confidence levels without being restricted to a limited number of fonts. Another object of this invention is to provide a pattern recognition method which is fast enough to be integrated in a digital copier or used in a mobile terminal such as a smartphone or tablet PC. Another object of this invention is to provide a computer program product for implementing said pattern recognition method and a mobile terminal having said pattern recognition method in executable format on the mobile terminal. These objects are achieved according to the invention as described in the independent claims. As used herein, "pattern recognition" means any form of recognition of a shape or digital image such as characters or combinations of characters, graphics, or sounds (e.g. in speech recognition), by means of any type of computing device. It should be noted that, when used for characters, the recognition system of this invention is not limited to single-character recognition but may also recognize multiple characters such as ligatures or touching characters (for example, rn, vv), parts of a character such as an accent (for example an acute accent in é), other graphic symbols, or characters crossed by an underline or other graphic element. As used herein, "template" means the combination of at least one binary image containing at least the reliable bits of a recognized form and a class to which the template belongs. For example, a template for a lowercase "a" may contain a binary bit image of the "a" character in a given font, possibly binary images for normal and bold versions, and a code or identifier representing the class of the character "a".
Thus the templates for lowercase "a" in different fonts contain different binary images, since the shape of the character differs from one font to another, but belong to the same class. In a first aspect, the present invention provides a pattern recognition method that comprises, starting from an input form: a) normalizing the input form into a standardized form of predetermined size; b) generating a reliable form from the normalized form using at least one morphological operator; c) calculating a distance between the reliable form and selected models that are selected from a model library in which each model belongs to a class; d) classifying the reliable form in at least one of the classes of the selected models by means of at least one nonparametric classification method that uses said selected model classes and said calculated distances as inputs and gives identified classes as well as confidence levels. It has been found that by combining these steps, a pattern recognition method can be obtained which provides reliable confidence levels without being restricted to a limited number of fonts. In the first place, the reliability of the form to be recognized is notably improved by the normalization and the morphological operator. In addition, the calculated distances between reliable forms and models are transformed into class confidence levels by the nonparametric classification method. As the classification provides confidence levels, these can be taken into account in subsequent steps to decide between, for example, one combination of characters and another based on, for example, a contextual analysis (dictionary lookup, comparison of character sizes, etc.). In embodiments of the invention, the normalization may include, for example for characters, normalization of a character input image to a binary image whose width and height are selected among predefined widths and heights.
The normalization process can also increase the thickness of character strokes for thin lines and decrease the thickness for thick lines. In embodiments according to the invention, the generation of reliable forms from the normalized form using one or more morphological operators may, for example, include the use of dilation and erosion operators. Mathematical morphology is a theory and a technique for analyzing and processing geometric structures. The basic idea in binary morphology is to probe an image with a predefined simple shape and to draw conclusions about how well this shape matches the shapes in the image. This simple "probe" is called the structuring element and is itself a binary image. Further information can be found in Serra, J. et al., "Mathematical morphology and its applications to image processing", Kluwer Academic Publishers, 1994, which is incorporated herein by reference in its entirety. In embodiments according to the invention, the calculation of a distance (which is a measure of the mutual difference) between the reliable forms and the selected models can be performed by means of a decision tree. In the case of characters, the distance may be, for example, the number of pixels that differ, but other distances are possible. Instead of calculating the distances of the reliable forms with respect to each of the model forms, as is done in the prior art, the present invention describes embodiments with a decision tree that can speed up this calculation without loss of precision. In embodiments according to the invention, the classification of reliable forms using one or more non-parametric classification methods may include, for example, one of the following non-parametric classification methods: k nearest neighbors (KNN), Parzen windows, probabilistic neural network, radial basis function (RBF).
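As a purely illustrative sketch, and not the patent's implementation, binary erosion and dilation with a small structuring element can be written as follows (function names and the zero-padding border convention are assumptions of this sketch):

```python
import numpy as np

def erode(image, selem):
    """Binary erosion: a pixel stays 1 only if every 1-pixel of the
    structuring element, centered on it, lands on a 1 in the image."""
    h, w = image.shape
    sh, sw = selem.shape
    ph, pw = sh // 2, sw // 2
    # pad with zeros so the probe can be centered on border pixels
    padded = np.zeros((h + 2 * ph, w + 2 * pw), dtype=image.dtype)
    padded[ph:ph + h, pw:pw + w] = image
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            window = padded[y:y + sh, x:x + sw]
            out[y, x] = 1 if np.all(window[selem == 1] == 1) else 0
    return out

def dilate(image, selem):
    """Binary dilation: a pixel becomes 1 if the structuring element,
    centered on it, touches at least one 1-pixel of the image."""
    h, w = image.shape
    sh, sw = selem.shape
    ph, pw = sh // 2, sw // 2
    padded = np.zeros((h + 2 * ph, w + 2 * pw), dtype=image.dtype)
    padded[ph:ph + h, pw:pw + w] = image
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            window = padded[y:y + sh, x:x + sw]
            out[y, x] = 1 if np.any(window[selem == 1] == 1) else 0
    return out
```

Eroding with a 3x3 square, for example, keeps only the pixels whose full 3x3 neighborhood is black, which is exactly the "reliable black pixels" idea used later in this description.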
The use of nonparametric classification methods for pattern recognition has the advantage of not assuming that the forms of the underlying density functions are known. The density functions p(x|wi) can be estimated from sample forms (e.g. Parzen windows, probabilistic neural network). Other methods can be used which directly estimate the posterior probabilities (e.g. the k nearest neighbors). Further information can be found in Ripley, B.D., "Pattern Recognition and Neural Networks", Cambridge University Press, 1996, ISBN 0 521 46086 7, which is incorporated herein by reference in its entirety. In other aspects, the invention relates to a computer program comprising software code fragments for performing the steps of the method described herein, which are stored on a storage medium and intended to be loaded into a computing device memory for execution; or a computing device comprising such a computer program, such as for example a mobile terminal or a digital copier; or a computing device having a computer program for performing certain steps described herein and provided for communicating with an external server that is intended to perform other steps described herein.

Brief description of the drawings

The invention will be explained in more detail by way of the following description and the accompanying drawings. Figure 1 shows a block diagram of a feature-based character recognition method of the prior art. Figure 2 shows a block diagram of an embodiment of a character recognition method according to the invention. Figure 3 shows a block diagram of an embodiment of a normalization step of a character recognition method according to the invention. Figure 4 shows a block diagram of one embodiment of generating reliable first and second binary images of a character recognition method according to the invention. Figure 5 shows a block diagram of an embodiment of calculating the distances between an examined character and models, which may be used in a character recognition method according to the invention. Figure 6 shows a block diagram of the non-parametric classification that can be used in a character recognition method according to the invention. Figure 7 shows a block diagram of one embodiment of calculating the distances between the examined character and the models using a soft-decision decision tree. Figure 8 also illustrates an embodiment of generating reliable first and second binary images. Figure 9 further illustrates the comparison between a reliable black-pixel image of a character and a reliable white-pixel image of a model. Figure 10 further illustrates a binary decision tree. Figure 11 shows the calculation of the current distance around a node of the binary decision tree. Figure 12 illustrates a binary image of an underlined character and the companion binary image of hidden pixels.

Modes of implementing the invention

The present invention will be described in connection with particular embodiments and with reference to certain drawings, but the invention is not limited thereto, being limited only by the claims. The drawings described are only schematic and are non-limiting. In addition, the terms first, second, third and similar in the description and in the claims are used to distinguish between similar elements and not necessarily to describe a sequential or chronological order. The terms are interchangeable under appropriate circumstances and the embodiments of the invention may operate in other sequences than those described or illustrated herein. In addition, the various embodiments, although referred to as "preferred", should be interpreted as exemplary ways in which the invention can be implemented rather than as limiting the scope of the invention.
The term "comprising", as used in the claims, should not be construed as being limited to the means or steps listed thereafter; it does not exclude other elements or steps. It must be interpreted as specifying the presence of the stated elements, integers, steps or components, but does not exclude the presence or addition of one or more other elements, integers, steps or components, or groups thereof. Therefore, the scope of the expression "a device comprising A and B" should not be limited to devices consisting only of components A and B; rather, with respect to the present invention, the only listed components of the device are A and B, and the claim should also be interpreted to include the equivalents of these components. An embodiment of a character recognition method according to the invention, shown in FIG. 2, comprises the following steps, starting from a character input image 201: (a) normalizing 202 the character into a binary image having a width and height selected among predefined widths and heights. This normalization process can also increase the thickness of character strokes for thin lines and decrease it for thick lines; (b) generating reliable binary images 203 from the normalized character binary image using morphological operators such as, for example, dilation and erosion operators; (c) calculating a distance 204 between the reliable binary images and the selected models; (d) classifying 206 using one or more non-parametric classification methods such as, for example, k nearest neighbors (KNN), Parzen windows, probabilistic neural network, or radial basis function (RBF), which give classes and confidence levels as output 207. In the following, preferred embodiments of these steps will be described in more detail.

Image normalization

The normalization process, shown in FIG. 3, transforms the character input image into a binary image 307 having a fixed width and height.
Width and height are selected at step 303 from a list 304 of predefined widths and heights. The selected width and height are those whose width/height ratio is closest to the character's width/height ratio, which is calculated in step 302. For example, the predefined widths and heights could be 24x30, 16x30 or 8x30. For the recognition of small symbols such as the dot, comma or quotation marks, the predefined width and height may be smaller. The normalization process can thicken thin lines; for example, lines of 1 or 2 pixels are thickened to 3 pixels. The normalization process can also thin thick lines by thickening thin white strokes. The input image of the character may be binary or grayscale, but the normalized image 307 is binary, a pixel having a value of 1 for black and 0 for white (or vice versa). The process may also provide a companion binary image, called the hidden-pixels binary image, with 0 for "don't know" and 1 otherwise. This binary image may be used to process a character crossed by an underline or other graphic element (see the example in Figure 12). In this case, the color of some pixels is not known and the value "don't know" is given.

Generation of reliable binary images

Pixels surrounded by other pixels of the same color are more reliable than pixels at the edge of the characters. Using the morphological erosion operator (see the examples in Figures 8 and 9), a binary image with reliable black pixels can be formed, and by applying the morphological erosion operator to the inverted binary image, a binary image with reliable white pixels can be formed. The structuring element of the erosion operation may be, for example, a 3x3 black square or a cross consisting of a horizontal line and a vertical line each of 3 black pixels. In one embodiment, shown in FIG. 4, a first set of reliable binary images of black and white pixels 403, 406 is generated respectively by erosion 402 of a character input image 401 (which may be the normalized binary image 307 generated in FIG. 3) and by inversion 404 and erosion 405 of this image 401. A second, additional set of reliable binary images of black and white pixels 409, 410 is created by further eroding the first set of reliable binary images in subsequent erosion steps 407, followed by an OR step 408. In this embodiment, 4 structuring elements are used for the erosion steps 407: a horizontal line, a vertical line and two diagonal lines, each of 3 black pixels. It should be noted that erosion on a binary image can be performed very quickly using logical operations on machine bytes.

Models

A model 205 in embodiments of the invention may include a predefined number of binary images having a predefined width and height and the associated class (e.g., a character identification). In a first embodiment, a template contains the binary image of a character that has been normalized to a predefined width and height. Reliable binary images can then be generated during the character recognition process using the same operations as for the examined character. In a second embodiment, a template contains reliable binary images pre-computed using the same operations as for the examined character. In a third embodiment, the template contains pre-computed reliable binary images that aggregate different examples of binary character images. This can be done to reduce the number of models. For example, a normal version and a bold version of a character may be in the same template.

The distance calculation between the examined character and a model

An embodiment of the distance calculation step 204 is shown in FIG. 5. Models are selected in steps 501, 502 and exclusive OR (XOR) operations 503 are performed between the reliable binary images of the examined character and the reliable binary images of the model.
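Before turning to the distance formulas, the generation of first and second reliable images described above can be sketched as follows. This is an assumption-laden illustration, not the patent's code: structuring elements are expressed as (dy, dx) offset lists mirroring the cross and the four 3-pixel lines mentioned in the description, and the function names are invented here.

```python
import numpy as np

def erode_with_offsets(img, offsets):
    """Erosion with a structuring element given as (dy, dx) offsets:
    a pixel survives only if img is 1 at every offset position
    (out-of-bounds positions count as 0, i.e. zero padding)."""
    out = np.ones_like(img)
    h, w = img.shape
    for dy, dx in offsets:
        shifted = np.zeros_like(img)
        ys = slice(max(0, -dy), min(h, h - dy))
        xs = slice(max(0, -dx), min(w, w - dx))
        shifted[ys, xs] = img[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)]
        out &= shifted
    return out

# 3-pixel cross (horizontal + vertical line through the center)
CROSS = [(-1, 0), (0, -1), (0, 0), (0, 1), (1, 0)]
# four 3-pixel lines: horizontal, vertical and the two diagonals
LINES = [
    [(0, -1), (0, 0), (0, 1)],
    [(-1, 0), (0, 0), (1, 0)],
    [(-1, -1), (0, 0), (1, 1)],
    [(-1, 1), (0, 0), (1, -1)],
]

def reliable_images(img):
    """First set: erosion of the image and of its inverse.
    Second set: OR of the erosions of the first set by four lines."""
    a1b = erode_with_offsets(img, CROSS)       # first reliable black pixels
    a1w = erode_with_offsets(1 - img, CROSS)   # first reliable white pixels
    a2b = np.zeros_like(img)
    a2w = np.zeros_like(img)
    for line in LINES:
        a2b |= erode_with_offsets(a1b, line)
        a2w |= erode_with_offsets(a1w, line)
    return a1b, a1w, a2b, a2w
```

The second set is thus a strict subset of the first: a pixel survives only if it sits on a straight run of first-set reliable pixels in at least one of the four directions.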
Exclusive OR (XOR) operations are performed between the "reliable black pixel" and "reliable white pixel" binary images. The number of "on" pixels in the result is used to calculate a distance at step 504. In the preferred embodiment, if: - A1b is the first reliable black-pixel image of the examined character; - A1w is the first reliable white-pixel image of the examined character; - T1b is the first reliable black-pixel image of the model; - T1w is the first reliable white-pixel image of the model; - A2b is the second reliable black-pixel image of the examined character; - A2w is the second reliable white-pixel image of the examined character; - T2b is the second reliable black-pixel image of the model; - T2w is the second reliable white-pixel image of the model; the following operations are performed in step 503: R1 = (A1b XOR T1w) OR (A1w XOR T1b); R2 = (A2b XOR T2w) OR (A2w XOR T2b); Distance = #(R1) + 4 x #(R2), with #() the function that calculates the number of "on" pixels. In one embodiment, A1h, the binary image of hidden pixels, is used, and AND operations are performed to calculate R1' and R2': R1' = R1 AND A1h; R2' = R2 AND A1h; Distance = #(R1') + 4 x #(R2'), with #() the function that calculates the number of "on" pixels. It should be noted that exclusive OR (XOR), OR and AND operations on a binary image are performed very quickly using the corresponding logical operations on machine bytes.

Non-parametric classification

The classification is performed using a non-parametric classification method that gives classes and confidence levels; see Figure 6. The input 601 consists of the selected classes and the distances obtained by the process of Figure 5. This is subjected to the non-parametric classification step 602 and the resulting output 603 consists of the identified classes and confidence levels.
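The distance of steps 503-504 above can be sketched with Python integers standing in for byte-packed image rows, which also illustrates why the operations map onto fast machine-word logic. This is an illustration only: it implements the formulas exactly as printed above, `popcount` stands in for #(), and the names are invented here.

```python
def popcount(rows):
    """#(): total number of 'on' pixels, rows being ints used as bit masks."""
    return sum(bin(r).count("1") for r in rows)

def combine(a, b, c, d):
    """(a XOR b) OR (c XOR d), row by row, as in R1 and R2 above."""
    return [(x ^ y) | (u ^ v) for x, y, u, v in zip(a, b, c, d)]

def distance(a1b, a1w, a2b, a2w, t1b, t1w, t2b, t2w):
    """Distance = #(R1) + 4 x #(R2); each argument is a list of bit-rows."""
    r1 = combine(a1b, t1w, a1w, t1b)  # R1 = (A1b XOR T1w) OR (A1w XOR T1b)
    r2 = combine(a2b, t2w, a2w, t2b)  # R2 = (A2b XOR T2w) OR (A2w XOR T2b)
    return popcount(r1) + 4 * popcount(r2)
```

The hidden-pixels variant would simply AND each row of R1 and R2 with the corresponding row of A1h before counting.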
Parametric modeling of probability density functions is based on the assumption that the forms of the probability density functions are known. This knowledge typically comes from either a scientific analysis of the physical process or an empirical analysis of the observed data, for example a Gaussian distribution. What remains to be done, in statistical inference, is to estimate the parameters associated with the probability density function. The more sophisticated nonparametric density estimation that is used according to the invention includes techniques that make no assumptions about the forms of the probability density functions - with the exception of the weak assumption that probability density functions are smooth functions - and may represent arbitrary probability density functions given enough samples. One technique of this type is density estimation by Parzen windows. Other possible techniques are k nearest neighbors (KNN), the probabilistic neural network or the radial basis function (RBF). Nonparametric classification is generally slower than parametric classification, but the speed is improved in this preferred embodiment by prioritizing possible classes, for example by giving frequently used fonts a higher priority in the classification. Unclassified or misclassified characters are used for training purposes, for example for training the classification process. For example, speed can be improved by selecting a limited number of representative models. The selection of models can be done in a training process on a large number of character samples in various fonts. Character samples are ordered from characters in the most common font to those in the least frequent font. A template library is first constructed with the templates corresponding to the characters of the most frequent fonts. Templates are then added to the library for unclassified or misclassified characters.
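As an illustrative sketch of one of the nonparametric methods named above (not the patent's code; the representation of the input as (class, distance) pairs and the confidence definition as a vote fraction are assumptions of this sketch), a k-nearest-neighbors step that turns template distances into classes with confidence levels could look like:

```python
from collections import Counter

def knn_classify(distances, k=5):
    """distances: list of (class_label, distance) pairs coming from the
    template matching step. Returns (label, confidence) pairs, where the
    confidence is the fraction of the k nearest templates in that class,
    best class first."""
    nearest = sorted(distances, key=lambda cd: cd[1])[:k]
    votes = Counter(label for label, _ in nearest)
    return [(label, count / k) for label, count in votes.most_common()]
```

Because the output keeps the alternative classes with their confidences, a later contextual decision step (dictionary lookup, character size comparison) can still overrule the top candidate, as described earlier.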
Decision tree

Decision trees are well known in the art. A decision is made by traversing the tree from the root node to a terminal node. At each non-terminal node of a tree, a local decision is made to select a path to a child node. This local decision is made by examining a selection of features. A decision tree is fast but suffers from a loss of precision due to the accumulation of local decision errors. In preferred embodiments of the invention, examples of which are explained by means of FIGS. 7a-b and 10, a decision tree with "soft" decisions is used to perform the steps that combine the selection of models and the distance calculation. At each node, a current distance is calculated and all nodes are visited unless the current distance exceeds a threshold. In the embodiment of FIG. 7a, reliable character binary images are input 701 and subjected at step 703 to a decision tree 702. Step 703 includes calculating distances by means of soft decisions and produces selected model classes and distances at step 704. The decision tree may be a binary tree as shown in FIG. 10, i.e. each node 1001, 1002, 1003 always has two child nodes 1002, 1003, 1004, 1005. A terminal node 1006 corresponds to a model. Each node 1001-1006 contains a list of reliable black pixels and reliable white pixels. A local distance is calculated at each node by comparing the reliable pixels of the examined character with the reliable pixels in the list. The current distance d_out of a node is the current distance d_in of the parent node incremented by the local distance Δd, see FIG. 11. At the terminal node 1006, the current distance is the distance between the character and the model, and therefore the decision tree gives the same precision as a direct comparison of reliable binary images. When the current distance exceeds a predefined threshold at a node, that node and all of its child nodes are discarded.
The value of the threshold is chosen by balancing the expected speed and the expected accuracy. When a terminal node 1006 is reached, the class of the model and the distance are recorded. The classes of the selected models and the calculated distances are used in the nonparametric classification method, which gives the identified classes as well as confidence values. When no terminal node is reached, the examined character is rejected as not being a character. In the embodiment of Figure 7b, reliable binary images of the character are input 705 and the decision tree 706 only examines the pixels belonging to the second reliable binary images. The distances calculated in step 707 then correspond to 4 x #(R2), and an intermediate result is produced 708 comprising the selected models and the first portions of the distances. In step 710, #(R1) is then calculated by directly comparing the first reliable binary images of the examined character with the first reliable binary images of the selected model 709 to obtain the total distance, produced at step 711 together with the selected model classes.

Construction of the decision tree

In the preferred embodiment, the decision tree is constructed from bottom to top. Terminal nodes are added first. Each terminal node is associated with a model. They form a list of models. The 2 models that are most similar in this list are selected and a node is added. Its children are the nodes corresponding to the selected models. A new model is built and associated with the new node: its reliable binary images contain the reliable pixels that are common to both models. Each of the 2 child nodes receives a list of the reliable pixels that are found only in its own model. The models corresponding to the 2 child nodes are removed from the model list and the new model is added. The models are again examined to select the 2 most similar models, and another node is added in the same way. Nodes are added until the model list is empty, and the decision tree is then complete.
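The soft-decision traversal described above can be sketched as follows. This is a simplified illustration under stated assumptions, not the patent's code: a node's reliable-pixel list is reduced to (pixel index, expected color) pairs, the character's reliable pixels to a dict, and `Node`/`traverse` are invented names.

```python
class Node:
    """A node of the soft-decision tree. `pixels` lists the
    (index, expected_color) checks local to this node; `children` is
    empty for a terminal node, which then carries a model class."""
    def __init__(self, pixels, children=(), model_class=None):
        self.pixels = pixels
        self.children = list(children)
        self.model_class = model_class

def traverse(node, char_pixels, d_in=0, threshold=10, results=None):
    """Visit every node whose accumulated distance stays within the
    threshold; record (class, distance) at each surviving terminal node."""
    if results is None:
        results = []
    # local distance Δd: reliable pixels of the character that disagree
    # with the reliable pixels listed at this node
    delta = sum(1 for i, color in node.pixels
                if char_pixels.get(i) not in (None, color))
    d_out = d_in + delta              # d_out = d_in + Δd
    if d_out > threshold:
        return results                # prune this node and all its children
    if not node.children:
        results.append((node.model_class, d_out))
    for child in node.children:
        traverse(child, char_pixels, d_out, threshold, results)
    return results
```

Because local distances only accumulate along a path, the distance recorded at a surviving terminal node equals the full character-to-model distance, which is why the tree loses no precision compared with direct comparison.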
Models can then be removed from the tree.

Other embodiments

Embodiments according to the invention are processes, algorithms and software code for performing the steps as described herein, storage media on which these processes, algorithms and software code are stored, and devices and systems for executing these processes, algorithms and software code. Embodiments of the invention may include provisions, e.g. software code fragments, for combining the digital input image containing one or more input forms with the digital textual information obtained by the recognition. The combined file is preferably compressed in such a way that different parts or layers of the file or image are compressed with different compression algorithms optimized for the part or layer concerned. Hyper-compression is preferably used. Examples of such a high-compression process are disclosed, for example, in US patent publications US 5778092 (A) and US 2008273807 (A1), both of which are incorporated herein by reference in their entirety. Both algorithms use a low-resolution foreground plane, a low-resolution background plane and a high-resolution bitmap to obtain the required compression as well as a high resolution for the text. Therefore, in one embodiment of the system of the present invention, the system software code also includes code fragments for compressing the image resulting from the combination of the graphic input image and the recognized text using a high-compression method that segments the image into bitonal data and color data and compresses the data separately with a compression method adapted to each data type. The high-compression method could follow the Mixed Raster Content (MRC) model, which is the subject of ITU-T Recommendation T.44. With the measures taken as described herein, the invention may take the form of an application executing on a mobile terminal, or of a mobile terminal executing this application.
The mobile terminal can run a standard Microsoft® Windows® operating system such as Microsoft® Windows XP® or Windows 7®, but other operating systems can also be used, for example iOS, Android, Blackberry OS, Windows Phone 7, HP webOS, or others. The steps mentioned in this document may be implemented as independent programs, may be incorporated or integrated into driver software, or may be provided as plug-ins cooperating with existing software applications, but may also be provided in other ways known to the person skilled in the art. The steps mentioned in this document may also be performed in a distributed manner on different devices, for example with some steps being performed on a mobile terminal and other steps being performed on an external server, for example using a SaaS delivery model (Software as a Service).
Claims:
[1] A pattern recognition method comprising, starting from an input form, the steps of: a) normalizing the input form into a standardized form of predetermined size; b) generating a reliable form from the normalized form using at least one morphological operator; c) calculating a distance between the reliable form and selected models that are selected from a model library in which each model belongs to a class; characterized in that said selecting is performed by means of a decision tree comprising a plurality of nodes leading to a plurality of terminal nodes, each node containing a list of reliable black pixels and reliable white pixels and each terminal node forming one of said selected models; in that said distance calculation comprises: calculating a local distance at each node by comparing the reliable pixels of said reliable form with said reliable black and white pixels in said list; incrementing at each node a current distance by the local distance, so that the current distance at the terminal node is said distance between the reliable form and the selected model; and eliminating nodes for which said current distance exceeds a predetermined threshold; and in that the method further comprises a step of: d) classifying the reliable form in at least one of the classes of the selected models by means of at least one nonparametric classification method which uses said classes of the selected models and said calculated distances as inputs and produces identified classes as well as confidence levels. [2] A pattern recognition method as in claim 1, wherein the normalization comprises a change in the thickness of the character strokes of the input form. [3] The pattern recognition method as in claim 2, wherein the change in thickness comprises an increase in thickness for thin character strokes and a decrease in thickness for thick character strokes. [4]
A pattern recognition method as in claim 1, wherein one of said at least one morphological operator is erosion. [5] A pattern recognition method as in claim 4, wherein a first reliable form is generated by said erosion and wherein a second reliable form is generated by erosion of the first reliable form. [6] The pattern recognition method as in claim 4, wherein said erosion is performed using logical operations on machine bytes. [7] The pattern recognition method as in claim 1, wherein one of said at least one morphological operator is dilation. [8] The pattern recognition method as in claim 1, wherein said selection of models in the model library is performed by a pattern matching process. [9] The pattern recognition method as in claim 1, wherein said selection of models in the model library is performed by means of a decision tree followed by a pattern matching process. [10] The pattern recognition method as in claim 1, wherein said at least one non-parametric classification method is selected from the group consisting of: Parzen window density estimation, k nearest neighbors, probabilistic neural network, radial basis function. [11] The pattern recognition method as in claim 1, wherein said classification comprises prioritizing a predetermined set of models from all possible models. [12] The pattern recognition method as in claim 11, wherein said prioritization comprises assigning a higher priority to models corresponding to frequently used character fonts.
[13] A computer program product that can be loaded directly into a computer memory, comprising software code portions for executing, starting with an input form, the steps of: a) normalizing the input form into a normalized form of predetermined size; b) generating a reliable form from the normalized form using at least one morphological operator; c) calculating a distance between the reliable form and selected models that are selected in a model library in which each model belongs to a class; d) classifying the reliable form in at least one of the classes of the selected models using at least one non-parametric classification method that uses said classes of the selected models and said calculated distances as inputs and outputs the identified classes as well as confidence levels. [14] The computer program product as in claim 13, wherein the normalization comprises a change in the thickness of the character strokes of the input form, wherein the change in thickness comprises an increase in thickness for thin character strokes and a decrease in thickness for thick character strokes. [15] The computer program product as in claim 13, wherein one of said at least one morphological operator is erosion, whereby a first reliable form is generated and wherein a second reliable form is generated by erosion of the first reliable form. [16] The computer program product as in claim 15, wherein said erosion is performed using logic operations on machine bytes. [17] The computer program product as in claim 13, wherein said selection of models in the model library is performed by means of a decision tree. [18] The computer program product as in claim 17, wherein said decision tree comprises a plurality of nodes leading to a plurality of terminal nodes, each node containing a list of reliable black pixels and reliable white pixels and each terminal node forming one of said selected models; wherein said distance calculation comprises: calculating a local distance at each node by comparing the reliable pixels of said reliable form with said reliable black and white pixels in said list; incrementing at each node a current distance by the local distance, so that the current distance at the terminal node is said distance between the reliable form and the selected model; and wherein said model selection comprises eliminating nodes at which said current distance exceeds a predetermined threshold. [19] The computer program product as in claim 13, wherein said at least one non-parametric classification method is selected from the group consisting of: Parzen window density estimation, K nearest neighbors, probabilistic neural network, radial basis function. [20] The computer program product as in claim 13, wherein said classification comprises prioritizing a predetermined set of models from all possible models, wherein said prioritizing comprises assigning a higher priority to models corresponding to frequently used character fonts. [21] The computer program product as in claim 13, stored on a medium usable by a computer.
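The tree walk of claim 18 can be sketched as follows. The node layout (dicts with "black", "white", "children" and "model" keys) and the threshold value are illustrative assumptions, not the patent's data structures:

```python
def walk(node, black_pixels, white_pixels, current=0, threshold=10):
    """Accumulate a distance down the decision tree, eliminating branches
    whose current distance exceeds the threshold; each surviving terminal
    node yields a (model, distance) pair.
    """
    # Local distance: reliable pixels of the node that the form contradicts.
    local = sum(1 for p in node["black"] if p not in black_pixels)
    local += sum(1 for p in node["white"] if p not in white_pixels)
    current += local
    if current > threshold:          # eliminate this branch
        return []
    if not node.get("children"):     # terminal node = one selected model
        return [(node["model"], current)]
    matches = []
    for child in node["children"]:
        matches.extend(walk(child, black_pixels, white_pixels,
                            current, threshold))
    return matches
```

The surviving (model, distance) pairs are exactly the inputs that step d) of claim 13 feeds to the non-parametric classifier.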
[22] A mobile terminal having a pattern recognition method executable on said mobile terminal and comprising software code portions for executing, starting with an input form, the steps of: a) normalizing the input form into a normalized form of predetermined size; b) generating a reliable form from the normalized form using at least one morphological operator; c) calculating a distance between the reliable form and selected models that are selected in a model library in which each model belongs to a class; d) classifying the reliable form in at least one of the classes of the selected models using at least one non-parametric classification method that uses said classes of the selected models and said calculated distances as inputs and outputs the identified classes as well as confidence levels. [23] The mobile terminal according to claim 22, said mobile terminal being a smartphone. [24] The mobile terminal according to claim 22, said mobile terminal being a tablet computer.
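Of the non-parametric classifiers listed in the claims, K nearest neighbors is the simplest to illustrate. The sketch below is a generic textbook version operating on (class, distance) pairs such as those produced for the selected models; the vote-fraction confidence is an assumption, since the patent does not specify how its confidence levels are derived:

```python
from collections import Counter

def classify(distances, k=3):
    """distances: (class_label, distance) pairs, e.g. the per-model
    distances computed against the reliable form."""
    nearest = sorted(distances, key=lambda t: t[1])[:k]
    votes = Counter(label for label, _ in nearest)
    label, count = votes.most_common(1)[0]
    return label, count / k          # confidence = fraction of the k votes
```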
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
WO1990003012A2 | 1988-09-07 | 1990-03-22 | Harry James Etherington | Image recognition
Priority:
Application number | Filing date | Patent title
US201161522274P | 2011-08-11 |
US201161522274 | 2011-08-11 |
US201213427062 | 2012-03-22 |
US13/427,062 | 2012-03-22 | US9496359B2: Integrated circuit having chemically modified spacer surface (priority 2011-03-28)
US13/442,192 | 2012-04-09 | US8463054B2: Hierarchical OCR using decision tree and nonparametric classifier (priority 2011-08-11)
US201213442192 | 2012-04-09 |